Improvement of Multi-Layer Perceptron (MLP) training using optimization algorithms

Authors

  • Zahra Beheshti
  • Siti Mariyam Shamsuddin
Abstract

Artificial Neural Network (ANN) is one of the modern computational methods proposed to solve increasingly complex problems in the real world (Xie et al., 2006; Chau, 2007). An ANN is characterized by its pattern of connections between the neurons (called its architecture), its method of determining the weights on the connections (called its training, or learning, algorithm), and its activation function. Training is accomplished by presenting a sequence of training vectors, or patterns, each with an associated target output vector; the weights are then adjusted according to a learning algorithm.
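To make this training cycle concrete, the sketch below trains a one-hidden-layer MLP with plain gradient-descent backpropagation on the XOR problem: each pattern is presented together with its target vector, and the connection weights are adjusted from the resulting error. The layer sizes, learning rate and toy data are illustrative assumptions, not details taken from the paper.

# Illustrative sketch only: a single-hidden-layer MLP trained with plain
# gradient descent, to make the "present pattern, then adjust weights"
# cycle concrete. Layer sizes, learning rate and the XOR data are
# arbitrary choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # training vectors
T = np.array([[0], [1], [1], [0]], dtype=float)               # target output vectors

n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))  # hidden -> output weights
lr = 0.5                                         # learning rate (assumed)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    # forward pass: activation function applied at each layer
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)

    # backward pass: gradients of the squared error w.r.t. the weights
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)

    # weight adjustment driven by the learning algorithm (gradient descent)
    W2 -= lr * H.T @ dY
    W1 -= lr * X.T @ dH

# trained outputs (should move toward the targets 0, 1, 1, 0)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))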


Similar articles

A New Hybrid model of Multi-layer Perceptron Artificial Neural Network and Genetic Algorithms in Web Design Management Based on CMS

The size and complexity of websites have grown significantly during recent years. In line with this growth, the need to maintain most of their resources has intensified. Content Management Systems (CMSs) are software presented in accordance with the increased demands of users. With the advent of Content Management Systems, factors such as domains, predesigned module development, grap...


Classification of ECG signals using Hermite functions and MLP neural networks

Classification of heart arrhythmia is an important step in developing devices for monitoring the health of individuals. This paper proposes a three-module system for classification of electrocardiogram (ECG) beats. These modules are: a denoising module, a feature extraction module and a classification module. In the first module the stationary wavelet transform (SWT) is used for noise reduction of ...


G-Prop-II: Global Optimization of Multilayer Perceptrons using GAs

A general problem in model selection is to obtain the right parameters that make a model fit observed data. For a Multilayer Perceptron (MLP) trained with Backpropagation (BP), this means finding the right hidden layer size, appropriate initial weights and learning parameters. This paper proposes a method (G-Prop-II) that attempts to solve that problem by combining a genetic algorithm (GA) and ...
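As a hedged illustration of the GA-plus-BP combination described here (a sketch of the general idea, not of G-Prop-II itself), the code below lets a small genetic algorithm choose a learning rate and an initial-weight scale, scoring each candidate by the error remaining after a short backpropagation run. The population size, mutation width and XOR task are assumptions.

# Hedged sketch of wrapping backpropagation inside a genetic algorithm:
# each individual encodes a learning rate and an initial-weight scale,
# and its fitness is the training error left after a short BP run.
# This illustrates the GA+BP idea, not the G-Prop-II algorithm itself.
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def bp_error(lr, scale, epochs=300):
    """Train a 2-4-1 MLP with backpropagation and return its final MSE."""
    W1 = rng.normal(scale=scale, size=(2, 4))
    W2 = rng.normal(scale=scale, size=(4, 1))
    for _ in range(epochs):
        H = sigmoid(X @ W1)
        Y = sigmoid(H @ W2)
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY
        W1 -= lr * X.T @ dH
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2))

# GA over (learning rate, weight scale): truncation selection + Gaussian mutation
pop = [(rng.uniform(0.01, 2.0), rng.uniform(0.1, 2.0)) for _ in range(10)]
for gen in range(15):
    scored = sorted(pop, key=lambda p: bp_error(*p))
    parents = scored[:5]                        # keep the fitter half
    children = [(abs(lr + rng.normal(0, 0.1)), abs(s + rng.normal(0, 0.1)))
                for lr, s in parents]           # mutated copies of the parents
    pop = parents + children

best = min(pop, key=lambda p: bp_error(*p))
print("best learning rate %.3f, init scale %.3f" % best)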


G-Prop-III: Global Optimization of Multilayer Perceptrons using an Evolutionary ...

This paper proposes a new version of a method (G-Prop-III, genetic backpropagation) that attempts to solve the problem of finding appropriate initial weights and learning parameters for a single hidden layer Multilayer Perceptron (MLP) by combining a genetic algorithm (GA) and backpropagation (BP). The GA selects the initial weights and the learning rate of the network, and changes the number o...


Analysis of Optimization Techniques for Feed Forward Neural Networks Based Image Compression

This paper reviews various optimization techniques available for training multi-layer perceptron (MLP) artificial neural networks for compression of images. These optimization techniques can be classified into two categories: derivative-based and derivative-free optimization. The former is based on the calculation of gradients and includes Gradient Descent, Conjugate Gradient, Quasi-Newton, Leve...
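The two families can be contrasted on the same toy MLP loss. The sketch below pairs a derivative-based update (gradient descent driven by a numerical gradient) with a derivative-free one (hill climbing that only compares loss values); the network size, step sizes and data are illustrative assumptions rather than settings from the paper.

# Hedged illustration of the two optimizer families named above, applied to
# the same small MLP loss: a derivative-based update (gradient descent)
# versus a derivative-free one (hill climbing on the flat weight vector).
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def unpack(w):               # flat 12-weight vector -> (W1, W2) for a 2-4-1 MLP
    return w[:8].reshape(2, 4), w[8:].reshape(4, 1)

def loss(w):
    W1, W2 = unpack(w)
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2))

# Derivative-based: gradient descent with a forward-difference numerical
# gradient (an analytic backprop gradient would normally be used instead).
w_gd = rng.normal(size=12) * 0.5
for _ in range(2000):
    grad = np.array([(loss(w_gd + 1e-5 * np.eye(12)[i]) - loss(w_gd)) / 1e-5
                     for i in range(12)])
    w_gd -= 2.0 * grad

# Derivative-free: hill climbing that only compares loss values.
w_hc = rng.normal(size=12) * 0.5
for _ in range(5000):
    cand = w_hc + rng.normal(scale=0.1, size=12)
    if loss(cand) < loss(w_hc):
        w_hc = cand

print("gradient descent loss:", round(loss(w_gd), 4))
print("hill climbing loss:   ", round(loss(w_hc), 4))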



Journal:

Volume   Issue 

Pages  -

Publication date: 2015